DSC 140B
Problems tagged with binary cross-entropy
Problem #169

Tags: binary cross-entropy, lecture-16, multiple outputs

A multi-label classifier has 3 output nodes with sigmoid activations. The true labels are \(\vec y = (1, 0, 1)\) and the predicted probabilities are \(\vec h = (0.9, 0.2, 0.8)\).

Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-\log(0.9) - \log(0.8) - \log(0.8) = -\log(0.9) - 2\log(0.8)\).

By the binary cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{3}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ \log(1 - h_k), & \text{if } y_k = 0 \end{cases}\]

Evaluating each term:

$$\begin{align*} k = 1&: \quad y_1 = 1, \text{ so } -\log(0.9) \\ k = 2&: \quad y_2 = 0, \text{ so } -\log(1 - 0.2) = -\log(0.8) \\ k = 3&: \quad y_3 = 1, \text{ so } -\log(0.8) \end{align*}$$

The total is \(-\log(0.9) - 2\log(0.8)\).
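As a quick numeric sanity check (not part of the original solution), the per-node case analysis above can be evaluated in Python; the helper name `bce` is ours:

```python
import math

def bce(h, y):
    # Binary cross-entropy summed over output nodes:
    # log(h_k) when y_k = 1, log(1 - h_k) when y_k = 0, then negate.
    return -sum(math.log(hk) if yk == 1 else math.log(1 - hk)
                for hk, yk in zip(h, y))

loss = bce([0.9, 0.2, 0.8], [1, 0, 1])
print(loss)  # ≈ 0.5516, agreeing with -log(0.9) - 2 log(0.8)
```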

Problem #170

Tags: binary cross-entropy, lecture-16, multiple outputs

A multi-label classifier has 4 output nodes with sigmoid activations. The true labels are \(\vec y = (0, 1, 0, 1)\) and the predicted probabilities are \(\vec h = (0.3, 0.7, 0.1, 0.9)\).

Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-2\log(0.7) - 2\log(0.9)\).

By the binary cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{4}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ \log(1 - h_k), & \text{if } y_k = 0 \end{cases}\]

Evaluating each term:

$$\begin{align*} k = 1&: \quad y_1 = 0, \text{ so } -\log(1 - 0.3) = -\log(0.7) \\ k = 2&: \quad y_2 = 1, \text{ so } -\log(0.7) \\ k = 3&: \quad y_3 = 0, \text{ so } -\log(1 - 0.1) = -\log(0.9) \\ k = 4&: \quad y_4 = 1, \text{ so } -\log(0.9) \end{align*}$$

The total is \(-2\log(0.7) - 2\log(0.9)\).
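As before, the symbolic answer can be checked numerically; the helper name `bce` is ours, not from the course materials:

```python
import math

def bce(h, y):
    # Binary cross-entropy summed over output nodes:
    # log(h_k) when y_k = 1, log(1 - h_k) when y_k = 0, then negate.
    return -sum(math.log(hk) if yk == 1 else math.log(1 - hk)
                for hk, yk in zip(h, y))

loss = bce([0.3, 0.7, 0.1, 0.9], [0, 1, 0, 1])
print(loss)  # ≈ 0.9241, agreeing with -2 log(0.7) - 2 log(0.9)
```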

Problem #171

Tags: binary cross-entropy, lecture-16, multiple outputs

A multi-label classifier has 3 output nodes with sigmoid activations. The true labels are \(\vec y = (1, 1, 0)\) and the predicted probabilities are \(\vec h = (0.8, 0.6, 0.4)\).

Compute the binary cross-entropy loss. Leave your answer in terms of \(\log\).

Solution

\(-\log(0.8) - 2\log(0.6)\).

By the binary cross-entropy formula:

\[\ell(\vec h, \vec y) = -\sum_{k=1}^{3}\begin{cases} \log h_k, & \text{if } y_k = 1 \\ \log(1 - h_k), & \text{if } y_k = 0 \end{cases}\]

Evaluating each term:

$$\begin{align*} k = 1&: \quad y_1 = 1, \text{ so } -\log(0.8) \\ k = 2&: \quad y_2 = 1, \text{ so } -\log(0.6) \\ k = 3&: \quad y_3 = 0, \text{ so } -\log(1 - 0.4) = -\log(0.6) \end{align*}$$

The total is \(-\log(0.8) - 2\log(0.6)\).
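A numeric check of this answer as well, using an illustrative helper (`bce` is our name for the summed loss):

```python
import math

def bce(h, y):
    # Binary cross-entropy summed over output nodes:
    # log(h_k) when y_k = 1, log(1 - h_k) when y_k = 0, then negate.
    return -sum(math.log(hk) if yk == 1 else math.log(1 - hk)
                for hk, yk in zip(h, y))

loss = bce([0.8, 0.6, 0.4], [1, 1, 0])
print(loss)  # ≈ 1.2448, agreeing with -log(0.8) - 2 log(0.6)
```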